Permutation entropy revisited


Similar articles

Permutation entropy

Permutation entropy is computationally efficient, robust to outliers, and effective for measuring the complexity of time series. We used this technique to quantify the complexity of continuous vital signs recorded from patients with traumatic brain injury (TBI). Using permutation entropy calculated from early vital signs (the initial 10–20% of a patient's hospital stay), we built classifiers to predict i...
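The excerpt describes permutation entropy only at a high level. Below is a minimal sketch of the standard Bandt-Pompe computation it refers to, assuming an embedding dimension m and time delay tau; the function name, defaults, and examples are illustrative, not taken from the paper.

```python
import math
import random
from collections import Counter

def permutation_entropy(x, m=3, tau=1, normalize=True):
    """Bandt-Pompe permutation entropy of a 1-D sequence x.

    m   -- embedding (pattern) dimension
    tau -- time delay between the samples forming a pattern
    """
    n = len(x) - (m - 1) * tau
    if n <= 0:
        raise ValueError("series too short for the chosen m and tau")

    # Count how often each ordinal pattern (the argsort of a window) occurs.
    counts = Counter()
    for i in range(n):
        window = [x[i + j * tau] for j in range(m)]
        pattern = tuple(sorted(range(m), key=window.__getitem__))
        counts[pattern] += 1

    # Shannon entropy of the ordinal-pattern distribution.
    h = -sum((c / n) * math.log(c / n) for c in counts.values())
    return h / math.log(math.factorial(m)) if normalize else h

# A monotone ramp uses a single pattern (entropy ~0); noise uses many (entropy near 1).
print(permutation_entropy(list(range(100))))                      # ~0.0
print(permutation_entropy([random.random() for _ in range(1000)]))  # close to 1.0
```

Because only the rank order within each window matters, not the magnitudes, the statistic is largely insensitive to outliers, which is the robustness property the excerpt mentions.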


Renyi Entropy Estimation Revisited

We revisit the problem of estimating the entropy of discrete distributions from independent samples, studied recently by Acharya, Orlitsky, Suresh and Tyagi (SODA 2015), improving their upper and lower bounds on the necessary sample size n. For estimating the Rényi entropy of order α, up to constant accuracy and error probability, we show the following upper bound: n = O(1) · 2^{(1 − 1/α) H_α} for integer α...
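The bound above concerns how many samples an estimator needs; the paper's improved estimator is not reproduced in the excerpt. As a hedged sketch of the quantity being estimated, here is the naive plug-in Rényi entropy computed from empirical frequencies; the function name and example are illustrative only.

```python
import math
import random
from collections import Counter

def renyi_entropy_plugin(samples, alpha):
    """Naive plug-in Renyi entropy (base 2) of order alpha != 1,
    computed from the empirical frequencies of discrete samples."""
    if alpha == 1:
        raise ValueError("order 1 is Shannon entropy; handle separately")
    n = len(samples)
    counts = Counter(samples)
    power_sum = sum((c / n) ** alpha for c in counts.values())
    return math.log2(power_sum) / (1 - alpha)

# Example: samples from a fair 8-sided die give ~3 bits for any order alpha.
samples = [random.randrange(8) for _ in range(100_000)]
print(renyi_entropy_plugin(samples, alpha=2))
```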


Configurational Entropy Revisited

Entropy change is categorized in some prominent general chemistry textbooks as being either positional (configurational) or thermal. Positional entropy focuses on the number of positions in space that can be occupied by the molecules of a system. Then, to the extent that more positions exist after a process ...


Entropy and Inference, Revisited

We study properties of popular near-uniform (Dirichlet) priors for learning undersampled probability distributions on discrete nonmetric spaces and show that they lead to disastrous results. However, an Occam-style phase space argument expands the priors into their infinite mixture and resolves most of the observed problems. This leads to a surprisingly good estimator of entropies of discrete d...
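As context for the "disastrous results" claim, here is a minimal sketch of the kind of fixed-Dirichlet estimator the excerpt criticizes: the posterior-mean entropy under a symmetric Dirichlet(a) prior, using the Wolpert-Wolf closed form and scipy.special.digamma. The infinite-mixture estimator the paper proposes is not reproduced; the alphabet size and the choices of a are illustrative.

```python
import numpy as np
from scipy.special import digamma

def dirichlet_entropy_estimate(counts, a=1.0):
    """Posterior-mean Shannon entropy (nats) of a discrete distribution,
    given observed counts and a symmetric Dirichlet(a) prior.

    Closed form: E[H | counts] = psi(B + 1) - sum_i (b_i / B) * psi(b_i + 1),
    where b_i = n_i + a and B = sum_i b_i.
    """
    b = np.asarray(counts, dtype=float) + a
    B = b.sum()
    return digamma(B + 1.0) - np.sum((b / B) * digamma(b + 1.0))

# Severely undersampled example: 5 samples from a 1000-symbol alphabet.
counts = np.zeros(1000)
counts[:5] = 1  # five distinct symbols, each seen once
for a in (0.001, 0.1, 1.0):
    print(a, dirichlet_entropy_estimate(counts, a))
# The estimate is driven almost entirely by the choice of a, which is the
# prior sensitivity the excerpt describes in the undersampled regime.
```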


Shannon's entropy revisited

I consider the effect of a finite sample size on the entropy of a sample of independent events. I propose a formula for entropy which satisfies Shannon's axioms and which reduces to Shannon's entropy when the sample size is infinite. I discuss the physical meaning of the difference between the two formulas, including some practical implications, such as maximum achievable channel utilization, and minimu...
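The paper's own finite-sample formula is not quoted in the excerpt, so it is not reproduced here. As a generic illustration of the finite-sample effect it discusses, the sketch below compares the naive plug-in Shannon entropy with the classical Miller-Madow bias correction, which is a different, well-known correction and not the author's proposal.

```python
import math
import random
from collections import Counter

def plugin_and_miller_madow(samples):
    """Naive plug-in Shannon entropy (bits) of a sample, together with the
    classical Miller-Madow correction H_plugin + (K - 1) / (2 N ln 2),
    where K is the number of distinct symbols observed and N the sample size."""
    n = len(samples)
    counts = Counter(samples)
    h_plugin = -sum((c / n) * math.log2(c / n) for c in counts.values())
    k = len(counts)
    h_mm = h_plugin + (k - 1) / (2 * n * math.log(2))
    return h_plugin, h_mm

# Small samples from a fair 8-sided die (true entropy = 3 bits):
for n in (20, 200, 2000):
    print(n, plugin_and_miller_madow([random.randrange(8) for _ in range(n)]))
# The plug-in value is biased low for small n; the correction shrinks the gap.
```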



Journal

Journal title: Chaos, Solitons & Fractals

Year: 2019

ISSN: 0960-0779

DOI: 10.1016/j.chaos.2018.12.039